Robust variable selection for mixture linear regression models

Author

  • Yunlu Jiang
Abstract

In this paper, we propose a robust variable selection method to estimate and select relevant covariates for finite mixture of linear regression models, assuming that the error terms follow a Laplace distribution and fitting the model to the data after trimming high leverage points. We introduce a revised Expectation-Maximization (EM) algorithm for the numerical computation. Simulation studies indicate that the proposed method is robust to both high leverage points and outliers in the y-direction, and can achieve consistent variable selection in the presence of outliers or heavy-tailed error distributions. Finally, we apply the proposed methodology to analyze a real data set.
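The paper's revised EM algorithm (with trimming and a variable-selection penalty) is not reproduced here, but the basic idea of fitting a mixture of linear regressions with Laplace errors by EM can be sketched. The Python outline below is a minimal sketch under simplifying assumptions: it omits the trimming of high leverage points and the penalty term, and solves each weighted least-absolute-deviations M-step with a single iteratively reweighted least squares step; the function name laplace_mixreg_em and its defaults are hypothetical.

```python
import numpy as np

def laplace_mixreg_em(X, y, K=2, n_iter=100, eps=1e-6, rng=None):
    """Toy EM for a K-component mixture of linear regressions with Laplace errors.
    Illustrative only: no trimming of high leverage points and no penalty term."""
    rng = np.random.default_rng(rng)
    n, p = X.shape
    beta = rng.normal(size=(K, p))           # regression coefficients per component
    b = np.ones(K)                           # Laplace scale parameters
    pi = np.full(K, 1.0 / K)                 # mixing proportions

    for _ in range(n_iter):
        # E-step: responsibilities under Laplace component densities
        resid = y[None, :] - beta @ X.T                           # (K, n)
        logf = -np.log(2 * b)[:, None] - np.abs(resid) / b[:, None]
        logpost = np.log(pi)[:, None] + logf
        logpost -= logpost.max(axis=0, keepdims=True)
        gamma = np.exp(logpost)
        gamma /= gamma.sum(axis=0, keepdims=True)

        # M-step: weighted LAD for each component, approximated by one IRLS step
        for k in range(K):
            w = gamma[k] / (np.abs(y - X @ beta[k]) + eps)
            W = np.sqrt(w)
            beta[k] = np.linalg.lstsq(X * W[:, None], y * W, rcond=None)[0]
            r = np.abs(y - X @ beta[k])
            b[k] = max((gamma[k] * r).sum() / gamma[k].sum(), eps)
        pi = gamma.mean(axis=1)

    return beta, b, pi
```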


Similar resources

An Overview of the New Feature Selection Methods in Finite Mixture of Regression Models

Variable (feature) selection has attracted much attention in contemporary statistical learning and recent scientific research. This is mainly due to the rapid advancement in modern technology that allows scientists to collect data of unprecedented size and complexity. One type of statistical problem in such applications is concerned with modeling an output variable as a function of a sma...


Model Selection for Mixture Models Using Perfect Sample

We have considered a perfect sampling method for model selection of finite mixture models with either a known (fixed) or unknown number of components, which can be applied in the most general setting regarding the relation between the rival models and the true distribution: both, one, or neither may be well specified or misspecified, and they may be nested or non-nested. We consider mixt...


Package ‘AICcmodavg’ September 12, 2013

Description This package includes functions to create model selection tables based on Akaike’s information criterion (AIC) and the second-order AIC (AICc), as well as their quasi-likelihood counterparts (QAIC, QAICc). Tables are printed with delta AIC and Akaike weights. The package also features functions to conduct classic model averaging (multimodel inference) for a given parameter of intere...
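As a rough illustration of what such a model selection table contains, the standard AICc and Akaike-weight formulas can be computed directly. The Python sketch below is not the package's R interface; the function name aicc_table and the example numbers are illustrative assumptions.

```python
import numpy as np

def aicc_table(loglik, n_params, n_obs):
    """Compute AICc, delta AICc, and Akaike weights for candidate models.
    Illustrative Python sketch of the standard formulas, not the AICcmodavg R API."""
    loglik = np.asarray(loglik, dtype=float)
    k = np.asarray(n_params, dtype=float)
    aic = -2.0 * loglik + 2.0 * k
    aicc = aic + 2.0 * k * (k + 1.0) / (n_obs - k - 1.0)   # small-sample correction
    delta = aicc - aicc.min()
    weights = np.exp(-delta / 2.0)
    weights /= weights.sum()
    return aicc, delta, weights

# Hypothetical example: three candidate models fit to n = 30 observations
aicc, delta, w = aicc_table(loglik=[-52.1, -50.3, -49.8],
                            n_params=[2, 3, 5], n_obs=30)
```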


Penalized Bregman Divergence Estimation via Coordinate Descent

Variable selection via penalized estimation is appealing for dimension reduction. For penalized linear regression, Efron et al. (2004) introduced the LARS algorithm. Recently, the coordinate descent (CD) algorithm was developed by Friedman et al. (2007) for penalized linear regression and penalized logistic regression and was shown to gain computational superiority. This paper explores...
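For context, a minimal sketch of cyclic coordinate descent for the plain lasso (squared-error loss with an L1 penalty) is given below; it does not implement the penalized Bregman divergence estimators studied in the cited paper, and the function name lasso_cd and its defaults are illustrative assumptions.

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """Cyclic coordinate descent for the lasso:
    minimize (1/(2n)) * ||y - X beta||^2 + lam * ||beta||_1.
    Minimal sketch of the CD idea; assumes no column of X is identically zero."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n            # per-coordinate curvature
    resid = y - X @ beta
    for _ in range(n_iter):
        for j in range(p):
            # gradient term using the partial residual that excludes coordinate j
            rho = (X[:, j] @ resid) / n + col_sq[j] * beta[j]
            new = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]  # soft threshold
            resid += X[:, j] * (beta[j] - new)   # keep the residual up to date
            beta[j] = new
    return beta
```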


Robust portfolio selection with polyhedral ambiguous inputs

Ambiguity in the inputs of a model is typical, especially in the portfolio selection problem, where the true distribution of the random variables is usually unknown. Here we use a robust optimization approach to address the ambiguity in the conditional value-at-risk minimization model. We obtain explicit models of robust conditional value-at-risk minimization for polyhedral and correlated polyhedral am...
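As background for the conditional value-at-risk (CVaR) minimization model mentioned above, the sketch below solves the nominal, non-robust scenario-based CVaR minimization as the Rockafellar-Uryasev linear program using scipy. The robust counterparts under polyhedral ambiguity developed in the cited paper are not reproduced here, and the function name min_cvar_portfolio is a hypothetical label.

```python
import numpy as np
from scipy.optimize import linprog

def min_cvar_portfolio(returns, beta=0.95):
    """Nominal CVaR-minimizing long-only portfolio via the Rockafellar-Uryasev LP,
    with equally weighted return scenarios (rows of `returns`)."""
    S, p = returns.shape                       # scenarios x assets
    # decision vector: [w (p weights), alpha (VaR level), u (S scenario excesses)]
    c = np.concatenate([np.zeros(p), [1.0], np.full(S, 1.0 / ((1.0 - beta) * S))])
    # u_s >= -r_s' w - alpha   <=>   -r_s' w - alpha - u_s <= 0
    A_ub = np.hstack([-returns, -np.ones((S, 1)), -np.eye(S)])
    b_ub = np.zeros(S)
    # fully invested: portfolio weights sum to one
    A_eq = np.concatenate([np.ones(p), [0.0], np.zeros(S)])[None, :]
    b_eq = [1.0]
    bounds = [(0, None)] * p + [(None, None)] + [(0, None)] * S
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
    return res.x[:p], res.fun                  # weights and minimized CVaR
```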



Journal title:

Volume   Issue

Pages  -

Publication date: 2015